Version: 4.2

Relational Data Source

info

Parameters needed for the source connection are similar for SQL Server, Oracle, PostgreSQL, and Dameng databases; SQL Server is used as the example.

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Auto creation of metadata collection task: Whether to generate a metadata collection task automatically in X-DAM. It is enabled and the collection time is set to 1:00 by default.
  • Connection type:
      • Address/port: Enter the database server address, port, and database name for the connection.
      • JDBC: Enter a JDBC connection string for the connection.
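The guide does not list the JDBC string formats themselves. As a minimal sketch, the typical connection-string shapes for the four database types look like the following; the hosts, ports, and database names are placeholders, not values from this guide, and the exact format accepted by your driver version may differ.

```python
# Illustrative JDBC connection-string templates for the supported databases.
# Hosts, ports, and database names below are placeholder assumptions.
def jdbc_url(db_type: str, host: str, port: int, database: str) -> str:
    """Build a typical JDBC URL for the given database type (sketch)."""
    templates = {
        "sqlserver": "jdbc:sqlserver://{h}:{p};databaseName={d}",
        "oracle": "jdbc:oracle:thin:@{h}:{p}/{d}",   # service-name form
        "postgresql": "jdbc:postgresql://{h}:{p}/{d}",
        "dameng": "jdbc:dm://{h}:{p}/{d}",
    }
    return templates[db_type].format(h=host, p=port, d=database)

print(jdbc_url("sqlserver", "192.168.1.10", 1433, "mydb"))
# jdbc:sqlserver://192.168.1.10:1433;databaseName=mydb
```

If the Address/port connection type fails but the database is reachable, pasting a string in this shape into the JDBC connection type is a quick way to isolate driver-format issues.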

Interface Data Source

  • WebService

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Address: WebService interface address. Click Parse after entering the address; X-ETL then parses it automatically.
  • Http authorization: Enable HTTP authorization and enter the username and password if the interface requires authentication.
  • Method: Select the corresponding method based on the address parsing results.
  • Parse path: Data path in the response XML. For example: Body/OutputParameters/P_OUT_CONTENT/HEADERS/HEADER.
  • Input mode: Displays the corresponding input parameters based on the selected mode.
      • List mode: Click New and enter the interface parameter information under Input Param.
      • Xml mode: Configure the data interface by entering XML text.
  • Output mode: Displays the corresponding output parameters based on the selected mode.
      • Parse: parses strings to display the parameter list.
      • Do not parse: returns JSON strings.
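A parse path such as Body/OutputParameters/P_OUT_CONTENT/HEADERS/HEADER can be read as a slash-separated element path into the response XML, one element name per level. A minimal sketch with Python's standard library (the sample XML below is invented for illustration):

```python
import xml.etree.ElementTree as ET

def extract_by_parse_path(xml_text: str, parse_path: str):
    """Return all elements found at the slash-separated path, relative to the root."""
    root = ET.fromstring(xml_text)
    # "./" anchors the path at the root element itself
    return root.findall("./" + parse_path)

sample = """
<Envelope>
  <Body>
    <OutputParameters>
      <P_OUT_CONTENT>
        <HEADERS>
          <HEADER><ID>1</ID></HEADER>
          <HEADER><ID>2</ID></HEADER>
        </HEADERS>
      </P_OUT_CONTENT>
    </OutputParameters>
  </Body>
</Envelope>
"""

rows = extract_by_parse_path(
    sample, "Body/OutputParameters/P_OUT_CONTENT/HEADERS/HEADER"
)
print(len(rows))  # 2
```

Each matched HEADER element corresponds to one record row, which is why the parse path should point at the repeating element rather than at its parent.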
  • Restful

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Address: RESTful interface address.
  • Http method: Select the request method.
  • Max. single transfer data volume: The maximum data volume returned at one time.
  • Http authorization: Enable HTTP authorization and enter the username and password if the interface requires authentication.
  • Headers param: Header parameters of the HTTP request.
  • Body param: Body parameters of the HTTP request.
      • FormData: form submit method. Select the parameter type (Object or List) and click New to add parameters.
      • JSON: JSON submit method. Enter the input parameters in JSON format.
info

When GET or DELETE is set as the HTTP method, only the FormData parameter type is available.

  • Param type: Displays the corresponding input parameters based on the selected mode.
      • Object: Click New and enter the interface parameter information under Input Param.
      • List: Configure the data interface by entering XML text.
  • Output param config: Enable it to verify the parameters of the upstream interface. Click fields in the output param window to use them as output results; the result values are filled automatically. When the response data is parsed, it is compared with the results, and the connection fails if any difference is found.
  • Output type:
      • JSON: Output content is in JSON.
      • XML: Output content is in XML. Enter a parse path for parsing the content.
  • Output result: The arrays or objects selected from the response result to serve as the final output.
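Selecting an array or object from the response as the output result amounts to walking a key path into the parsed JSON. A minimal sketch, using an invented sample payload (the field names are illustrative, not a supOS response format):

```python
import json

def select_output(response_text: str, path: list):
    """Walk a list of keys into the parsed JSON response (sketch)."""
    node = json.loads(response_text)
    for key in path:
        node = node[key]  # descend one level per key
    return node

# Hypothetical response body for illustration only
response = '{"code": 0, "data": {"items": [{"id": 1}, {"id": 2}]}}'
items = select_output(response, ["data", "items"])
print(items)  # [{'id': 1}, {'id': 2}]
```

Picking the repeating array (here "items") rather than the wrapper object is what lets each array element become one output record.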

Object Model Data Source

Parameters:
  • Local Data Source: Enable it to set the data source to the current supOS platform.
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Auto creation of metadata collection task: Enable it to automatically generate a metadata collection task in X-DAM. It is enabled and the collection time is set to 1:00 by default.
  • supOS Address: Source supOS address and port.
  • supOS Tenant: Enter the tenant name when the supOS is the multi-tenant version.
  • supOS Auth: Use the AK/SK credential generated on the source supOS for data acquisition.
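The guide does not define how the AK/SK credential is used on the wire. For orientation only, the sketch below shows the generic HMAC-SHA256 request-signing pattern that AK/SK credentials typically follow; the header names, the string-to-sign layout, and the endpoint path are all assumptions, not the supOS protocol.

```python
import hashlib
import hmac

def sign_request(access_key: str, secret_key: str,
                 method: str, path: str, timestamp: str) -> dict:
    """Generic AK/SK request signing (illustrative only; not the supOS scheme)."""
    # A typical string-to-sign: method, path, and timestamp, newline-joined
    string_to_sign = f"{method}\n{path}\n{timestamp}"
    signature = hmac.new(
        secret_key.encode(), string_to_sign.encode(), hashlib.sha256
    ).hexdigest()
    return {
        "X-Access-Key": access_key,  # hypothetical header names
        "X-Timestamp": timestamp,
        "X-Signature": signature,
    }

headers = sign_request("my-ak", "my-sk", "GET", "/open-api/data", "1700000000")
print(headers["X-Signature"][:8])
```

The access key travels in the clear while the secret key never leaves the caller; the server recomputes the same HMAC to verify the request.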

File Data Source

  • Local file directory

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Database file path: Enter an absolute path under /volumes/supfiles/ in local storage. If the path does not exist, the platform creates it automatically.
tip

The path is created under the dam-etl-xxxxx pod, and you can use the following commands to view files inside.

kubectl get po | grep dam-etl          # find the pod
kubectl exec -it dam-etl-xxxx -- bash  # log in to the pod
cd /volumes/supfiles/your_path         # view the files
  • SMB

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Service address: IP address of the data source server.
  • Smb file path: The absolute path of the data source file to be connected.
  • Username/Password: The username and password used to log in to the data source.
  • HDFS

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Server address/port/Username: The IP address and port of the Hadoop server and the username used for login.
  • Relative file path: Relative directory path on the Hadoop server. HDFS files generated during data transfer from Hive are stored under it.
  • FTP

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Server address: IP address of the FTP server.
  • Ftp file path: The absolute path of the data file to be connected.
  • Username/Password: The username and password used for login.
  • MINIO

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Service address: IP address of the MINIO server.
  • Bucket name: Name of the MINIO bucket to be connected.
  • File path: The absolute path of the data file to be connected.
  • Username/Password: The username and password used for login.
  • OSS

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Service address: IP address of the OSS server.
  • Bucket name: Name of the OSS bucket to be connected.
  • File path: The absolute path of the data file to be connected.
  • Username/Password: The username and password used for login.

Big Data Storage

  • Hive

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Connection type:
      • Address/port: Enter the database server address, port, and database name for the connection.
      • JDBC: Enter a JDBC connection string for the connection.
  • Username/Password: The username and password used for database login.
  • HBase

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Zookeeper IP/Host/Port: IP address of Zookeeper. Use commas to separate IPs for cluster deployment.
  • Master IP/Host: Master IP or hostname of the Zookeeper cluster. Leave it empty for standalone deployment.
  • Special auth method: The Kerberos authentication file path is required when Kerberos is selected.
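The comma-separated Zookeeper list above resolves to one (host, port) endpoint per entry. A minimal sketch of that parsing, assuming the standard Zookeeper default port 2181 when an entry omits the port:

```python
def parse_quorum(quorum: str, default_port: int = 2181):
    """Split a comma-separated Zookeeper address list into (host, port) pairs."""
    endpoints = []
    for entry in quorum.split(","):
        entry = entry.strip()
        if ":" in entry:
            host, port = entry.rsplit(":", 1)
            endpoints.append((host, int(port)))
        else:
            endpoints.append((entry, default_port))  # assume default port
    return endpoints

print(parse_quorum("192.168.1.11,192.168.1.12:2182,zk-03"))
# [('192.168.1.11', 2181), ('192.168.1.12', 2182), ('zk-03', 2181)]
```

The IPs and hostnames here are placeholders; every entry in the field must be reachable from the X-ETL pods, which is what the host-mapping steps below set up.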
info

When adding HBase as data source, the following configurations on the X-ETL backend are needed.

  1. Add HBase cluster deployment network configurations to dam-resource, flink-jobmanager-batch and flink-taskmanager-batch.
kubectl edit deploy dam-resource
  2. Press i to start editing, and add the following content between spec and containers.
hostAliases:
- ip: 192.168.x.x        # hadoop cluster IP
  hostnames:
  - "ubuntu-hadoop01"    # hadoop cluster hostname
- ip: 192.168.x.x
  hostnames:
  - "ubuntu-hadoop02"
- ip: 192.168.x.x
  hostnames:
  - "ubuntu-hadoop03"
- ip: 192.168.x.x
  hostnames:
  - "ubuntu-hadoop04"

  3. Finish editing, then do the same for flink-jobmanager-batch and flink-taskmanager-batch.
  4. Check the pod name, and then access the container.
kubectl get po
kubectl exec -it dam-resource-5b558bfdd7-kxgv2 -- bash

  5. Inside the container, run the following command to check whether the mapping has been added to the hosts file.
cat /etc/hosts
  6. Exit the container.
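The check in step 5 can also be scripted. A small sketch that applies the same rule a resolver does, ignoring comments and matching any alias column of an /etc/hosts line (the sample content is illustrative, reusing the example hostnames from the YAML above):

```python
def has_mapping(hosts_text: str, hostname: str) -> bool:
    """Return True if any non-comment /etc/hosts line maps the hostname."""
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0]  # drop trailing comments
        fields = line.split()
        # fields[0] is the IP; every later field is a hostname alias
        if len(fields) >= 2 and hostname in fields[1:]:
            return True
    return False

sample = "127.0.0.1 localhost\n192.168.0.21 ubuntu-hadoop01\n"
print(has_mapping(sample, "ubuntu-hadoop01"))  # True
```

If a hostname from the hostAliases block is missing, the connection to HBase fails because the region servers advertise themselves by hostname, not IP.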

Message Queue

  • Kafka

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Server address: IP address and port of the Kafka server.
  • MQTT

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • Server address: IP address and port of the MQTT server in the format tcp://ip:port.
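A malformed tcp://ip:port address is a common cause of MQTT connection failures. A minimal validation sketch using the standard library (the broker address shown is a placeholder):

```python
from urllib.parse import urlparse

def check_mqtt_address(address: str):
    """Validate a tcp://ip:port MQTT broker address and return (host, port)."""
    parsed = urlparse(address)
    if parsed.scheme != "tcp" or not parsed.hostname or not parsed.port:
        raise ValueError(f"expected tcp://ip:port, got {address!r}")
    return parsed.hostname, parsed.port

print(check_mqtt_address("tcp://192.168.1.20:1883"))  # ('192.168.1.20', 1883)
```

Note that the scheme and port are both required here; an address entered as a bare ip:port would be rejected rather than silently misparsed.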
  • RocketMQ

Parameters:
  • Affiliated business system: Display category of data sources under Business system on the Data Connection page. You can add custom systems as needed.
  • RocketMQ host: Host IP and port of the RocketMQ server to be connected.
  • Timeout: Connection timeout duration. The default is 5,000 ms.